What is linear quadratic Gaussian control?

Linear quadratic Gaussian (LQG) control is an optimal control method that uses a mathematical model to design controllers for dynamic systems. The underlying problem is defined by three ingredients: a linear differential (or difference) equation for the state dynamics, a quadratic cost function, and additive Gaussian noise on the process and the measurements. The LQG controller is a feedback law, acting on an estimate of the state, that minimizes the expected value of the quadratic cost subject to the linear dynamics and the Gaussian noise model.
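As a concrete reference point, a standard continuous-time statement of the LQG problem looks like the following; the symbols A, B, C, Q, R, W, V are generic placeholders chosen here for illustration rather than quantities defined in the text:

```latex
% Plant dynamics with process noise w(t) and measurement noise v(t),
% both zero-mean Gaussian with covariances W and V:
\begin{aligned}
  \dot{x}(t) &= A\,x(t) + B\,u(t) + w(t), \\
  y(t)       &= C\,x(t) + v(t).
\end{aligned}

% Quadratic cost to be minimized in expectation over the noise,
% with state weight Q \succeq 0 and input weight R \succ 0:
J = \mathbb{E}\!\left[ \int_{0}^{\infty} \bigl( x(t)^{\mathsf{T}} Q\, x(t)
      + u(t)^{\mathsf{T}} R\, u(t) \bigr)\, dt \right]
```

The controller chooses u(t) based only on the noisy measurements y(t), which is why an estimate of the state is needed.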

LQG control is widely used in applications such as aerospace, automotive, and robotics. It requires a model of the system to be controlled, and in its standard form it assumes that the system is linear and time-invariant and that the disturbances are Gaussian. When these assumptions hold, the LQG controller minimizes the expected quadratic cost and so provides optimal performance. Because it relies on such a linear model, however, LQG control is less well suited to strongly non-linear or time-varying systems.

To design an LQG controller, one first obtains a mathematical model of the system to be controlled, derived either from first principles or from experimental data (system identification). A quadratic cost function is then defined that reflects the desired trade-off between regulation performance and control effort. The optimal controller can be derived by variational (calculus of variations or dynamic programming) arguments; in practice, by the separation principle, it is computed by solving two Riccati equations: one yielding the optimal state-feedback (LQR) gain and one yielding the optimal state-estimator (Kalman filter) gain. The controller then applies the LQR gain to the Kalman filter's state estimate.
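As a minimal sketch of this procedure, assuming a continuous-time state-space model, the two Riccati equations can be solved with SciPy as shown below. The matrices A, B, C, Q, R, W, V here are illustrative placeholders (a simple double integrator with hand-picked weights), not a model taken from the text:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Illustrative plant: a double integrator with a position measurement.
# (These matrices are placeholders, not a model from the text.)
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
C = np.array([[1.0, 0.0]])

# Quadratic cost weights (state Q, input R) and noise covariances
# (process W, measurement V) -- design choices, assumed here.
Q = np.diag([10.0, 1.0])
R = np.array([[1.0]])
W = np.diag([0.01, 0.01])
V = np.array([[0.1]])

# 1) LQR gain: solve the control algebraic Riccati equation.
X = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ X)          # control law: u = -K @ x_hat

# 2) Kalman filter gain: solve the dual (estimation) Riccati equation.
P = solve_continuous_are(A.T, C.T, W, V)
L = P @ C.T @ np.linalg.inv(V)           # estimator: x_hat_dot = A x_hat + B u + L (y - C x_hat)

print("LQR gain K:\n", K)
print("Kalman filter gain L:\n", L)
```

The resulting LQG controller simply combines the two pieces: the Kalman filter produces the state estimate from the noisy measurements, and the LQR gain maps that estimate to the control input.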

In summary, LQG control is a powerful control framework that provides optimal control for linear, time-invariant systems driven by Gaussian noise. It requires a model of the system, and it is less well suited to non-linear or time-varying systems.